Multi-Task Pre-Training of Deep Neural Networks for Digital Pathology
Authors
Abstract
In this work, we investigate multi-task learning as a way of pre-training models for classification tasks in digital pathology. It is motivated by the fact that many small and medium-size datasets have been released by the community over the years, whereas there is no large-scale dataset similar to ImageNet in the domain. We first assemble and transform many digital pathology datasets into a pool of 22 classification tasks and almost 900k images. Then, we propose a simple architecture and training scheme for creating a transferable model, and a robust evaluation and selection protocol in order to evaluate our method. Depending on the target task, we show that our models used as feature extractors either improve significantly over ImageNet pre-trained models or provide comparable performance. Fine-tuning improves performance over feature extraction and is able to recover the lack of specificity of ImageNet features, as both sources of features yield comparable performance.
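The abstract describes a single model trained jointly on a pool of classification tasks and later reused on a target task, either as a frozen feature extractor or fine-tuned end-to-end. The following is a minimal sketch of that general setup, assuming a PyTorch-style shared backbone with one linear head per task; the backbone choice (ResNet-50), head sizes, and task counts are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal multi-task pre-training sketch: shared backbone + one head per task.
# Architecture details are assumptions for illustration only.
import torch
import torch.nn as nn
import torchvision.models as models

class MultiTaskModel(nn.Module):
    def __init__(self, num_classes_per_task):
        super().__init__()
        backbone = models.resnet50(weights=None)   # assumed shared feature extractor
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()                # drop the ImageNet head
        self.backbone = backbone
        # one linear classification head per pre-training task
        self.heads = nn.ModuleList([nn.Linear(feat_dim, n) for n in num_classes_per_task])

    def forward(self, x, task_id):
        features = self.backbone(x)                # shared representation
        return self.heads[task_id](features)       # task-specific logits

# After multi-task pre-training, the backbone can be reused on a target task,
# either frozen as a feature extractor or fine-tuned end-to-end.
model = MultiTaskModel(num_classes_per_task=[2, 4, 3])        # toy task sizes
logits = model(torch.randn(8, 3, 224, 224), task_id=1)
```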
Similar resources
Rapid adaptation for deep neural networks through multi-task learning
We propose a novel approach to addressing the adaptation effectiveness issue in parameter adaptation for deep neural network (DNN) based acoustic models for automatic speech recognition by adding one or more small auxiliary output layers modeling broad acoustic units, such as mono-phones or tied-state (often called senone) clusters. In scenarios with a limited amount of available adaptation dat...
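This abstract describes adaptation by adding small auxiliary output layers that predict broad acoustic units (e.g. mono-phones) next to the main senone output. Below is a minimal sketch of that idea, assuming a simple feed-forward acoustic model; the layer sizes, output dimensions, and auxiliary loss weight are illustrative assumptions.

```python
# Sketch of a DNN acoustic model with a small auxiliary mono-phone output layer.
# All dimensions and the loss weighting are assumptions, not the paper's settings.
import torch.nn as nn
import torch.nn.functional as F

class AdaptableAcousticModel(nn.Module):
    def __init__(self, feat_dim=440, hidden=1024, n_senones=6000, n_monophones=40):
        super().__init__()
        self.shared = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.senone_head = nn.Linear(hidden, n_senones)        # main senone output
        self.monophone_head = nn.Linear(hidden, n_monophones)  # small auxiliary output

    def forward(self, x):
        h = self.shared(x)
        return self.senone_head(h), self.monophone_head(h)

def adaptation_loss(senone_logits, mono_logits, senone_y, mono_y, aux_weight=0.3):
    # joint objective: main senone loss plus a weighted auxiliary mono-phone loss
    return (F.cross_entropy(senone_logits, senone_y)
            + aux_weight * F.cross_entropy(mono_logits, mono_y))
```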
Multi-task learning deep neural networks for speech feature denoising
Traditional automatic speech recognition (ASR) systems usually get a sharp performance drop when noise is present in speech. To make a robust ASR, we introduce a new model using the multi-task learning deep neural networks (MTL-DNN) to solve the speech denoising task in feature level. In this model, the networks are initialized by pre-training restricted Boltzmann machines (RBM) and fine-tuned by...
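The abstract above describes a multi-task DNN that denoises speech at the feature level. A minimal sketch of such a network follows, assuming a clean-feature regression head plus an auxiliary classification head on shared hidden layers; the RBM pre-training step mentioned in the abstract is omitted, and all dimensions and the choice of auxiliary task are assumptions.

```python
# Sketch of a feature-level denoising MTL-DNN with shared hidden layers.
# Dimensions and the auxiliary task are assumptions; RBM pre-training is omitted.
import torch.nn as nn

class DenoisingMTLDNN(nn.Module):
    def __init__(self, feat_dim=40, hidden=2048, n_states=3000):
        super().__init__()
        self.shared = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.Sigmoid(),
            nn.Linear(hidden, hidden), nn.Sigmoid(),
        )
        self.clean_head = nn.Linear(hidden, feat_dim)   # main task: clean features
        self.state_head = nn.Linear(hidden, n_states)   # auxiliary task: state labels

    def forward(self, noisy_features):
        h = self.shared(noisy_features)
        return self.clean_head(h), self.state_head(h)
```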
An Overview of Multi-Task Learning in Deep Neural Networks
Multi-task learning (MTL) has led to successes in many applications of machine learning, from natural language processing and speech recognition to computer vision and drug discovery. This article aims to give a general overview of MTL, particularly in deep neural networks. It introduces the two most common methods for MTL in Deep Learning, gives an overview of the literature, and discusses rec...
Multi-task Deep Neural Networks in Automated Protein Function Prediction
Background: In recent years, deep learning algorithms have outperformed the state-of-the-art methods in several areas such as computer vision and speech recognition, thanks to efficient methods for training and for preventing overfitting, advances in computer hardware, and the availability of vast amounts of data. The high performance of multi-task deep neural networks in drug discovery has attra...
Adaptive dropout for training deep neural networks
Recently, it was shown that deep neural networks can perform very well if the activities of hidden units are regularized during learning, e.g., by randomly dropping out 50% of their activities. We describe a method called 'standout' in which a binary belief network is overlaid on a neural network and is used to regularize its hidden units by selectively setting activities to zero. This 'adapt...
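The 'standout' description above pairs each hidden layer with a belief network that decides which unit activities to drop. A minimal sketch of that mechanism follows, assuming a companion linear layer that produces per-unit keep probabilities; the paper's tying of the belief network's weights to the layer's own weights is simplified to independent parameters here.

```python
# Sketch of adaptive dropout ('standout'): a companion belief network computes
# per-unit keep probabilities and a sampled binary mask gates the activities.
# Using independent belief weights is a simplifying assumption.
import torch
import torch.nn as nn

class StandoutLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)   # ordinary hidden layer
        self.belief = nn.Linear(in_dim, out_dim)   # overlaid binary belief network

    def forward(self, x):
        h = torch.relu(self.linear(x))
        keep_prob = torch.sigmoid(self.belief(x))  # adaptive, input-dependent keep rate
        if self.training:
            mask = torch.bernoulli(keep_prob)      # stochastically drop units
            return h * mask
        return h * keep_prob                       # use the expectation at test time
```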
Journal
Journal title: IEEE Journal of Biomedical and Health Informatics
Year: 2021
ISSN: 2168-2208, 2168-2194
DOI: https://doi.org/10.1109/jbhi.2020.2992878